
    The use of fluorescence to investigate the factors leading to complex formation between naphthalenes and β-cyclodextrins

    Fraiji et al. determined a binding constant (K) of 581 ± 6 M−1 for the 1:1 complex between 2-acetylnaphthalene (2-AN) and β-cyclodextrin (β-CD) in H2O using fluorescence quenching experiments (Appl. Spectrosc. 1994, 48, 79). Molecular modeling experiments indicate the possibility of complex stabilization resulting from hydrogen bonds between the C=O group of 2-AN and the -OH groups on the rim of β-CD. To check this possibility, we measured the K of the 2-AN:trimethyl-β-CD (TM-β-CD) complex, in which all rim -OH groups are converted to -OCH3. In this case, K decreases to 134 M−1. To show that this is due to reduced hydrogen bonding capability rather than steric influence, the K of 2-AN:β-CD was measured at high pH. K decreases to 276 M−1 at pH 13.5, and an inflection point is observed at pH 12.2 (the pKa of the rim -OH groups). Therefore, our experiments confirm the significance of 2-AN carbonyl hydrogen bonding to β-CD -OH groups.

    Fluorescence studies were also performed with 1,5-DNSA, 2,6-MANS, and 2,6-TNS. K was 151 M−1 for 1,5-DNSA:β-CD and 105 M−1 for 1,5-DNSA:TM-β-CD. The smaller K for the TM-β-CD complex is probably due to steric effects associated with the rim -OCH3 groups. Thermodynamic properties of the 2,6-MANS:TM-β-CD complex were investigated. ΔH and ΔS were calculated as -35 kJ/mol and -47 J/(mol K), respectively. Catena and Bright calculated ΔH as -6.7 kJ/mol and ΔS as 50 J/(mol K) for the 2,6-MANS:β-CD complex (Anal. Chem. 1989, 61, 915). The methyl groups of TM-β-CD make the cavity environment more hydrophobic than that of natural β-CD, creating much stronger guest:host interactions. The methyl groups also restrict the motion of 2,6-MANS, which outweighs the favorable entropic effects of solvent replacement and solvent-shell disturbance. For the 2,6-TNS:TM-β-CD complex, K was calculated as 2358 M−1, compared to Catena and Bright's 1980 M−1 for the β-CD complex. The more hydrophobic cavity of TM-β-CD compensates for the loss of hydrogen bonding between the guest SO3− group and the CD rim -OH groups.

    CE was employed for the separation of LSD (d-lysergic acid diethylamide), LAMPA (d-lysergic acid methylpropylamide) and iso-LSD using several types of cyclodextrins as buffer additives. A sample consisting of LSD, LAMPA, and iso-LSD was partially resolved (R > 1) with 60 mM α-CD in a 50 mM acetate buffer, pH 4.03, at 15°C. Migration times were under 9 min. Another solution consisting of both 60 mM α-CD and 0.2 mM sulfated-β-CD in 50 mM acetate buffer, pH 4.03, at 25°C provided baseline separation (R > 1.5) of the sample mixture in a similar 9 min time window. The reproducibility of migration times for both methods was 4-5% between days and 1% on the same day. Partial separation of LSD and LAMPA (R ≈ 1) was observed in a 9 min time window using 15 mM sulfated-α-CD in a 50 mM borate buffer, pH 8.51, at 25°C. Between-day and same-day reproducibilities were 2-3% and 1%, respectively.
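    The ΔH and ΔS quoted above for the 2,6-MANS:TM-β-CD complex can be cross-checked against the Gibbs relation ΔG = ΔH − TΔS and K = exp(−ΔG/RT). A minimal Python sketch, assuming T = 298 K (the measurement temperature is not stated in the abstract):

    import math

    # Thermodynamic cross-check for the 2,6-MANS:TM-beta-CD complex,
    # using the abstract's values; T = 298 K is an assumption.
    R = 8.314     # gas constant, J/(mol K)
    T = 298.0     # assumed temperature, K
    dH = -35.0e3  # enthalpy change, J/mol
    dS = -47.0    # entropy change, J/(mol K)

    dG = dH - T * dS             # Gibbs free energy change, J/mol
    K = math.exp(-dG / (R * T))  # implied 1:1 binding constant, M^-1
    print(f"dG = {dG / 1e3:.1f} kJ/mol, K = {K:.0f} M^-1")
    # -> dG = -21.0 kJ/mol, K ~ 4.8e3 M^-1

    The unfavourable −TΔS term (about +14 kJ/mol) is outweighed by the large negative ΔH, consistent with the enthalpy-driven binding described above.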

    Automated classification of Persistent Scatterers Interferometry time series

    We present a new method for the automatic classification of Persistent Scatterers Interferometry (PSI) time series based on a conditional sequence of statistical tests. Time series are classified into distinctive predefined target trends, such as uncorrelated, linear, quadratic, bilinear and discontinuous, that describe different styles of ground deformation. Our automatic analysis overcomes limits related to the visual classification of PSI time series, which cannot be carried out systematically for large datasets. The method has been tested with reference to landslides using PSI datasets covering the northern Apennines of Italy. The clear distinction between the relative frequencies of uncorrelated, linear and non-linear time series with respect to the mean velocity distribution suggests that different target trends are related to different physical processes that are likely to control slope movements. The spatial distribution of classified time series is also consistent with the known distribution of flat areas, slopes and landslides in the test areas. Classified time series enhance the radar interpretation of slope movements at the site scale, offering significant advantages over conventional analysis based solely on the mean velocity. The test application also warns against potentially misleading classification outputs in the case of datasets affected by systematic errors. Although the method was developed and tested to investigate landslides, it should also be useful for the analysis of other ground deformation processes such as subsidence, swelling/shrinkage of soils, or uplift due to deep injections in reservoirs.
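    The paper's exact conditional test sequence is not reproduced in this abstract. The Python sketch below illustrates the general idea with ordinary least-squares trend fits compared by BIC; the bilinear and discontinuous branches are omitted, and the r2_min threshold is a hypothetical stand-in for the paper's actual statistical tests.

    import numpy as np

    def bic(y, y_fit, k):
        """Bayesian information criterion for a least-squares fit with k parameters."""
        n = len(y)
        rss = np.sum((y - y_fit) ** 2)
        return n * np.log(rss / n) + k * np.log(n)

    def classify_ts(t, y, r2_min=0.5):
        """Toy classifier for a PSI displacement time series:
        poorly-fit series are 'uncorrelated'; otherwise linear and
        quadratic trends compete on BIC."""
        lin = np.polyval(np.polyfit(t, y, 1), t)   # linear trend
        quad = np.polyval(np.polyfit(t, y, 2), t)  # quadratic trend

        ss_tot = np.sum((y - y.mean()) ** 2)
        r2_lin = 1 - np.sum((y - lin) ** 2) / ss_tot
        r2_quad = 1 - np.sum((y - quad) ** 2) / ss_tot

        if max(r2_lin, r2_quad) < r2_min:
            return "uncorrelated"
        return "linear" if bic(y, lin, 2) <= bic(y, quad, 3) else "quadratic"

    # Example: a noisy accelerating displacement series
    rng = np.random.default_rng(0)
    t = np.linspace(0, 5, 60)                            # years
    y = -2 * t - 1.5 * t**2 + rng.normal(0, 1, t.size)   # mm
    print(classify_ts(t, y))                             # -> quadratic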

    Simulation-based flood fragility and vulnerability analysis for expanding cities

    Accurately quantifying flood-induced impacts on buildings and other infrastructure systems is essential for risk-sensitive planning and decision-making in expanding urban regions. Flood-induced impacts are directly related to the physical components of assets damaged by contact with water. Such components include building contents (e.g., appliances, furniture) and other non-structural components whose damage or unavailability can severely impact the buildings' functionality. Conventional fragility analysis approaches for flooding do not account for the physical damage to individual components, mostly relying on empirical methods based on historical data. However, recent studies proposed simulation-based, assembly-based fragility models that account for the damage to building components. Such fragility models require developing detailed inventories of the vulnerable components of households and identifying building archetypes to be considered in a building portfolio for the region of interest. Content inventories and building portfolios have so far been obtained for specific socio-economic contexts such as the United States of America. However, building types and their contents can differ significantly between countries, making the available fragility models and computational frameworks unsuitable for flood vulnerability analysis in rapidly expanding cities characterised by extensive informal settlements, such as those in low- and middle-income countries. This paper details how to adapt the available methodologies for flood vulnerability assessment to the context of formal and informal settlements of expanding cities in the global south. It also details the development of content inventories for households in these cities using field surveys. The proposed survey is deployed in various areas vulnerable to floods in Kathmandu, Nepal. Based on the survey results, each component within the household is associated with a corresponding flood capacity (resistance) distribution (in terms of water height and flood duration). These distributions are then employed in a simulation-based probabilistic framework to obtain fragility relationships and consequence models. The relevant differences between the results obtained in this study and those from previous studies are then investigated for a case-study building type. In addition, the influence of socio-economic factors (e.g., household income) and past flood experience (possibly resulting in various flood-risk mitigation strategies at the household level) on the resulting flood impacts is also included in the model.
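    A minimal sketch of what such an assembly-based fragility simulation can look like, in Python, with entirely hypothetical component capacities and cost shares (the paper's actual inventories come from the Kathmandu field surveys):

    import numpy as np

    # Each component has a lognormal "capacity" in water depth (m); it is
    # damaged once inundation depth exceeds its sampled capacity.
    # All numbers below are illustrative assumptions.
    rng = np.random.default_rng(1)

    components = {          # name: (median capacity m, lognormal sigma, cost share)
        "floor finish": (0.05, 0.4, 0.15),
        "furniture":    (0.30, 0.5, 0.25),
        "appliances":   (0.60, 0.5, 0.35),
        "wall finish":  (1.00, 0.4, 0.25),
    }

    def damage_ratio(depth, n_sims=10_000):
        """Simulated fraction of replacement cost lost at a given flood depth."""
        total = np.zeros(n_sims)
        for median, sigma, share in components.values():
            capacity = rng.lognormal(np.log(median), sigma, n_sims)
            total += share * (depth > capacity)
        return total

    for d in np.arange(0.0, 2.1, 0.5):
        dr = damage_ratio(d)
        # fragility: probability of exceeding a 50% loss ratio at this depth
        print(f"depth {d:.1f} m: mean loss {dr.mean():.2f}, "
              f"P(loss > 0.5) = {(dr > 0.5).mean():.2f}")

    Repeating this over a grid of depths (and, as in the paper, flood durations) traces out the fragility relationship for one building archetype; consequence models then map the loss ratios to monetary impacts.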

    Imaging technologies in the differential diagnosis and follow-up of brown tumor in primary hyperparathyroidism: case report and review of the literature

    Brown tumors are osteolytic lesions associated with hyperparathyroidism (HPT). They may involve various skeletal segments, but rarely the cranio-facial bones. We report the case of a young boy with a swelling of the jaw secondary to a brown tumor presenting as the first manifestation of primary HPT (PHPT). He was found to have a brown tumor in the skull as well. Different imaging technologies were employed for the diagnosis and for follow-up after parathyroidectomy. We include a review of the literature on the use of such imaging technologies in the differential diagnosis of osteolytic lesions. A multidisciplinary approach comprising clinical, laboratory and imaging findings is essential for the differential diagnosis of brown tumor in PHPT.

    Activity and safety of RAD001 (everolimus) in patients affected by biliary tract cancer progressing after prior chemotherapy: a phase II ITMO study.

    BACKGROUND: Biliary tract cancer (BTC) is a highly lethal disease for which the best available therapy remains undetermined. The mammalian target of rapamycin (mTOR) pathway is up-regulated in several cancers, including BTC, and preclinical evidence indicates that mTOR inhibition may be effective in the treatment of BTC. We sought to evaluate the activity and tolerability of the mTOR inhibitor RAD001 (everolimus) in patients with BTC progressing after prior chemotherapy. PATIENTS AND METHODS: This was an open-label, single-arm, phase II study (EUDRACT 2008-007152-94) conducted in eight sites in Italy. Patients with locally advanced, metastatic or recurrent BTC progressing despite previous chemotherapy received a daily oral dose of everolimus 10 mg administered continuously in 28-day cycles. The two primary end points were disease control rate (DCR) and objective response rate (ORR). Secondary end points were progression-free survival (PFS), overall survival (OS) and time-to-progression (TTP). RESULTS: Thirty-nine patients were enrolled. The DCR was 44.7%, and the ORR was 5.1%. One patient showed a partial response at 2 months and one patient showed a complete response sustained up to 8 months. The median (95% confidence interval) PFS was 3.2 (1.8-4.0) months, and the median OS was 7.7 (5.5-13.2) months. The median TTP was 2.0 (1.7-3.7) months. The most common toxicities were asthenia (43.6%), thrombocytopenia (35.9%), pyrexia (30.8%) and erythema, mainly of mild-to-moderate severity. Two patients required dose reduction due to adverse events. CONCLUSION: Everolimus demonstrated a favourable toxicity profile and encouraging anti-tumour activity. Further trials are needed to establish the role of everolimus in the treatment of BTC. EUDRACT 2008-007152-94.

    Kupffer Cells Hasten Resolution of Liver Immunopathology in Mouse Models of Viral Hepatitis

    Kupffer cells (KCs) are widely considered important contributors to liver injury during viral hepatitis due to their pro-inflammatory activity. Herein we utilized hepatitis B virus (HBV)-replication competent transgenic mice and wild-type mice infected with a hepatotropic adenovirus to demonstrate that KCs do not directly induce hepatocellular injury nor do they affect the pathogenic potential of virus-specific CD8 T cells. Instead, KCs limit the severity of liver immunopathology. Mechanistically, our results are most compatible with the hypothesis that KCs contain liver immunopathology by removing apoptotic hepatocytes in a manner largely dependent on scavenger receptors. Apoptotic hepatocytes not readily removed by KCs become secondarily necrotic and release high-mobility group box 1 (HMGB-1) protein, promoting organ infiltration by inflammatory cells, particularly neutrophils. Overall, these results indicate that KCs resolve rather than worsen liver immunopathology.

    Beyond the Hype: A Real-World Evaluation of the Impact and Cost of Machine Learning-Based Malware Detection

    There is a lack of scientific testing of commercially available malware detectors, especially those that boast accurate classification of never-before-seen (i.e., zero-day) files using machine learning (ML). As a result, the efficacy of, and gaps among, the available approaches are opaque, inhibiting end users from making informed network security decisions and researchers from targeting gaps in current detectors. In this paper, we present a scientific evaluation of four market-leading malware detection tools to assist an organization with two primary questions: (Q1) To what extent do ML-based tools accurately classify never-before-seen files without sacrificing detection ability on known files? (Q2) Is it worth purchasing a network-level malware detector to complement host-based detection? We tested each tool against 3,536 total files (2,554 or 72% malicious, 982 or 28% benign), including over 400 zero-day malware samples, and tested with a variety of file types and delivery protocols. We present statistical results on detection time and accuracy, consider complementary analysis (using multiple tools together), and provide two novel applications of a recent cost-benefit evaluation procedure by Iannacone & Bridges that incorporates all of the above metrics into a single quantifiable cost. While the ML-based tools are more effective at detecting zero-day files and executables, the signature-based tool may still be an overall better option. Both network-based tools provide substantial (simulated) savings when paired with either host tool, yet both show poor detection rates on protocols other than HTTP or SMTP. Our results show that all four tools have near-perfect precision but alarmingly low recall, especially on file types other than executables and office files: 37% of malware tested, including all polyglot files, went undetected.
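    The cited procedure folds accuracy and timing metrics into one monetary figure. Below is a simplified Python sketch of that style of cost model; the unit costs and confusion-matrix counts are illustrative assumptions, not values from this paper or from Iannacone & Bridges.

    # Simplified cost-benefit score for a malware detector: lower is better.
    # All unit costs below are assumed for illustration.

    def detector_cost(tp, fp, fn, mean_detect_s,
                      c_fp=500.0,       # analyst triage cost per false alarm ($)
                      c_fn=25_000.0,    # expected incident cost per missed file ($)
                      c_delay=2.0):     # cost per second of detection delay ($)
        """Fold false alarms, misses, and detection latency into one cost."""
        return fp * c_fp + fn * c_fn + tp * c_delay * mean_detect_s

    # Hypothetical counts for two tools on the same corpus:
    ml_tool = detector_cost(tp=2300, fp=10, fn=254, mean_detect_s=45)
    sig_tool = detector_cost(tp=2100, fp=2, fn=454, mean_detect_s=5)
    print(f"ML-based:        ${ml_tool:,.0f}")
    print(f"Signature-based: ${sig_tool:,.0f}")
    # The ranking is highly sensitive to the assumed unit costs: a higher
    # per-miss cost favours the higher-recall ML tools, while higher alarm
    # and delay costs can favour the signature-based tool, which is how a
    # lower-recall tool can still come out as the overall better option.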

    AI ATAC 1: An Evaluation of Prominent Commercial Malware Detectors

    This work presents an evaluation of six prominent commercial endpoint malware detectors, a network malware detector, and a file-conviction algorithm from a cyber technology vendor. The evaluation was administered as the first of the Artificial Intelligence Applications to Autonomous Cybersecurity (AI ATAC) prize challenges, funded by and completed in service of the US Navy. The experiment employed 100K files (50/50% benign/malicious) with a stratified distribution of file types, including ~1K zero-day program executables (increasing experiment size two orders of magnitude over previous work). We present an evaluation process of delivering a file to a fresh virtual machine running the detection technology, waiting 90 s to allow static detection, then executing the file and waiting another period for dynamic detection; this allows greater fidelity in the observational data than previous experiments, in particular for resource and time-to-detection statistics. To execute all 800K trials (100K files × 8 tools), a software framework was designed to choreograph the experiment into a completely automated, time-synced, and reproducible workflow with substantial parallelization. A cost-benefit model was configured to integrate each tool's recall, precision, time to detection, and resource requirements into a single comparable quantity by simulating the costs of use. This provides a ranking methodology for cyber competitions and a lens through which to reason about the varied statistical viewpoints of the results. These statistical and cost-model results provide insights on the state of commercial malware detection.
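    A minimal Python sketch of the per-file trial loop described above; the VM-control helpers are hypothetical stubs standing in for the framework's real orchestration layer (hypervisor snapshots, file delivery, alert-log APIs), and the dynamic wait is an assumed value.

    import time

    STATIC_WAIT_S = 90    # static-detection window (from the abstract)
    DYNAMIC_WAIT_S = 120  # dynamic-detection window (assumed value)

    class StubVM:
        """Placeholder for a freshly provisioned VM running one detector."""
        def __init__(self, tool):
            self.tool, self.alert = tool, None
        def deliver(self, path):      # copy the sample onto the VM's disk
            pass
        def execute(self, path):      # detonate the sample
            self.alert = "dynamic"    # pretend the detector fires here
        def poll_alerts(self):        # query the detector's alert log
            return self.alert
        def destroy(self):            # revert snapshot: no state carries over
            pass

    def run_trial(tool, sample_path, sleep=time.sleep):
        vm = StubVM(tool)             # fresh VM per trial, for reproducibility
        t0 = time.monotonic()
        try:
            vm.deliver(sample_path)
            sleep(STATIC_WAIT_S)                # allow static detection
            verdict = vm.poll_alerts()
            if verdict is None:
                vm.execute(sample_path)         # then try dynamic detection
                sleep(DYNAMIC_WAIT_S)
                verdict = vm.poll_alerts()
            elapsed = time.monotonic() - t0
            return {"tool": tool, "detected": verdict is not None,
                    "time_to_detection_s": elapsed if verdict else None}
        finally:
            vm.destroy()

    # Demo run with sleeps stubbed out:
    print(run_trial("detector-A", "sample.exe", sleep=lambda s: None))

    Running one such trial per (file, tool) pair, in parallel across many VMs, is what scales the design to the full 800K-trial workflow.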